Sequential Conditional Generalized Iterative Scaling

Author

  • Joshua Goodman
Abstract

We describe a speedup for training conditional maximum entropy models. The algorithm is a simple variation on Generalized Iterative Scaling, but converges roughly an order of magnitude faster, depending on the number of constraints, and the way speed is measured. Rather than attempting to train all model parameters simultaneously, the algorithm trains them sequentially. The algorithm is easy to implement, typically uses only slightly more memory, and will lead to improvements for most maximum entropy problems.
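
To make the sequential idea concrete, the following is a minimal Python sketch (not code from the paper): it performs GIS-style training but adjusts one parameter at a time, caching the unnormalized scores and per-example normalizers so that each single-feature update only touches the events where that feature fires. It assumes binary features and uses a closed-form log-ratio step; all names are illustrative.

    import math

    def sequential_gis_sketch(data, feature_funcs, iterations=10):
        """GIS-style training that updates one parameter at a time.

        data:          list of (x, y) training pairs
        feature_funcs: list of binary feature functions f_i(x, y) -> 0 or 1
        The label set is taken to be the y values observed in the data.
        """
        labels = sorted({y for _, y in data})
        lam = [0.0] * len(feature_funcs)        # one parameter per feature

        # Empirical (observed) feature counts from the training data.
        observed = [sum(f(x, y) for x, y in data) for f in feature_funcs]

        # Cache unnormalized scores s[(j, y)] = exp(sum_i lam_i * f_i(x_j, y))
        # and per-example normalizers z[j]; with lam = 0 every score is 1.
        s = {(j, y): 1.0 for j in range(len(data)) for y in labels}
        z = [float(len(labels))] * len(data)

        for _ in range(iterations):
            # Train the parameters sequentially rather than all at once.
            for i, f in enumerate(feature_funcs):
                # Expected count of feature i under the current model p(y|x) = s/z.
                expected = sum(
                    s[(j, y)] / z[j] * f(x, y)
                    for j, (x, _) in enumerate(data)
                    for y in labels
                )
                if observed[i] == 0 or expected == 0:
                    continue
                delta = math.log(observed[i] / expected)  # closed-form step for binary features
                lam[i] += delta

                # Refresh cached scores and normalizers only where feature i fires.
                for j, (x, _) in enumerate(data):
                    for y in labels:
                        if f(x, y):
                            old = s[(j, y)]
                            s[(j, y)] = old * math.exp(delta)
                            z[j] += s[(j, y)] - old
        return lam

Roughly speaking, plain GIS shrinks every step by the maximum total feature count per event, whereas a sequential update only pays for the single feature being adjusted, which is consistent with the abstract's claim that the gain depends on the number of constraints.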

Similar articles

The Improved Iterative Scaling Algorithm: A Gentle Introduction

This note concerns the improved iterative scaling algorithm for computing maximum likelihood estimates of the parameters of exponential models. The algorithm was invented by members of the machine translation group at IBM's T. J. Watson Research Center in the early 1990s. The goal here is to motivate the improved iterative scaling algorithm for conditional models in a way that is as complete and self-c...
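
As background (the standard presentation of improved iterative scaling for conditional models, not quoted from the note itself): each iteration computes, for every parameter \lambda_i, an increment \delta_i solving

    \tilde{E}[f_i] \;=\; \sum_{x} \tilde{p}(x) \sum_{y} p_{\Lambda}(y \mid x)\, f_i(x, y)\, \exp\!\bigl(\delta_i\, f^{\#}(x, y)\bigr),
    \qquad f^{\#}(x, y) = \sum_i f_i(x, y),

and then sets \lambda_i \leftarrow \lambda_i + \delta_i. When f^{\#}(x, y) is constant this reduces to the closed-form GIS step; otherwise the one-dimensional equation is solved numerically, for example with Newton's method.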

Full text

A Faster Iterative Scaling Algorithm for Conditional Exponential Model

The conditional exponential model has been one of the most widely used conditional models in the machine learning field, and improved iterative scaling (IIS) has been one of the major algorithms for finding the optimal parameters of the conditional exponential model. In this paper, we propose a faster iterative algorithm, named FIS, that is able to find the optimal parameters faster than the IIS algorithm. T...
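
For readers unfamiliar with the term (a standard definition, not taken from the paper): a conditional exponential, or conditional maximum entropy, model with features f_i and parameters \lambda_i has the form

    p_{\Lambda}(y \mid x) \;=\; \frac{\exp\bigl(\sum_i \lambda_i f_i(x, y)\bigr)}{Z_{\Lambda}(x)},
    \qquad
    Z_{\Lambda}(x) \;=\; \sum_{y'} \exp\Bigl(\sum_i \lambda_i f_i(x, y')\Bigr),

and algorithms such as GIS, IIS, or the FIS variant above all search for the \Lambda that maximizes the conditional log-likelihood of the training data.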

Full text

A Generalized Iterative Scaling Algorithm for Maximum Entropy Reasoning in Relational Probabilistic Conditional Logic Under Aggregation Semantics

Recently, different semantics for relational probabilistic conditionals and corresponding maximum entropy (ME) inference operators have been proposed. In this paper, we study the so-called aggregation semantics, which covers both a statistical and a subjective view. The computation of its inference operator requires the calculation of the ME-distribution satisfying all probabilistic cond...

Full text

Accelerating Generalized Iterative Scaling Based on Staggered Aitken Method for On-Line Conditional Random Fields

In this paper, a convergent method based on Generalized Iterative Scaling (GIS) with staggered Aitken acceleration is proposed to estimate the parameters for an on-line Conditional Random Field (CRF). The staggered Aitken acceleration method, which alternates between the acceleration and non-acceleration steps, ensures computational simplicity when analyzing incomplete data. The proposed method...
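
The alternation described above can be illustrated with a generic sketch (the two-step schedule, the tolerance, and the `update` placeholder standing in for one GIS sweep are assumptions, not details from the paper):

    def aitken_step(x0, x1, x2):
        """Componentwise Aitken delta-squared extrapolation of three successive iterates."""
        accelerated = []
        for a, b, c in zip(x0, x1, x2):
            denom = c - 2.0 * b + a
            # Fall back to the plain iterate when the denominator is too small.
            accelerated.append(a - (b - a) ** 2 / denom if abs(denom) > 1e-12 else c)
        return accelerated

    def staggered_aitken(update, theta, sweeps=20):
        """Alternate plain fixed-point sweeps with Aitken acceleration steps.

        update: one full parameter sweep, e.g. a GIS update on the current batch.
        theta:  list of current parameter values.
        """
        for k in range(sweeps):
            t1 = update(theta)
            t2 = update(t1)
            # Even sweeps advance normally; odd sweeps extrapolate (the "staggered" part).
            theta = t2 if k % 2 == 0 else aitken_step(theta, t1, t2)
        return theta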

Full text

Language model adaptation using minimum discrimination information

In this paper, adaptation of language models using the minimum discrimination information criterion is presented. Language model probabilities are adapted based on unigram, bigram, and trigram features using a modified version of the generalized iterative scaling algorithm. Furthermore, a language model compression algorithm based on conditional relative entropy is discussed. It removes probabil...
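
As background on the criterion named in the title (a standard formulation, not quoted from the paper): minimum discrimination information adaptation chooses the model closest in Kullback-Leibler divergence to a background model p_{\text{bg}}, subject to constraints estimated from the adaptation data,

    p_{\text{adapt}} \;=\; \arg\min_{p}\; D\bigl(p \,\|\, p_{\text{bg}}\bigr)
    \quad\text{subject to}\quad E_p[f_i] = \tilde{E}[f_i] \ \ \text{for all } i,

and the solution has the exponential-tilting form

    p_{\text{adapt}}(w \mid h) \;\propto\; p_{\text{bg}}(w \mid h)\, \exp\!\Bigl(\sum_i \lambda_i f_i(h, w)\Bigr),

whose weights \lambda_i can be fitted with a GIS-style procedure, which is where the modified generalized iterative scaling mentioned above comes in.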

Full text

Journal title:

Volume   Issue 

Pages  -

Publication date: 2002